Samara Scientists Developed an Artificial Intelligence System with Stereo Vision for UAVs

The development will make drones more independent and safer

06.06.2025

Scientists at Samara University have developed a navigation software package with artificial intelligence and a stereo vision system for unmanned aerial vehicles (UAVs). The development, called Navigator, allows drones to follow preset routes autonomously, without human intervention, bypassing no-fly zones and reacting on their own to obstacles that suddenly appear ahead, such as other UAVs or birds flying towards them.

The software package also helps drones land safely in an emergency: the onboard neural network analyzes images from video cameras and determines whether there are people, cars, or other dangerous objects at the intended landing site and whether the terrain is suitable for landing. In addition, Navigator ensures the safe joint operation of several smart drones within a given area, for example, during the mass treatment of crops using UAVs.
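To illustrate the idea, here is a minimal sketch of such a landing-zone check, assuming a generic onboard detector that returns (label, confidence, bounding box) tuples for each camera frame. The labels, thresholds, and function names are illustrative assumptions and do not come from the actual Navigator code.

# Minimal sketch of a landing-zone safety check, assuming a generic onboard
# detector that returns (label, confidence, bounding_box) tuples for a frame.
# Labels, thresholds and names below are illustrative assumptions only.

UNSAFE_LABELS = {"person", "car", "animal"}  # objects that forbid landing here

def landing_zone_is_safe(detections, min_confidence=0.5):
    """Return True if no dangerous object was detected in the landing zone."""
    for label, confidence, _box in detections:
        if label in UNSAFE_LABELS and confidence >= min_confidence:
            return False
    return True

# Example output of a hypothetical detector.run(frame):
detections = [("tree", 0.91, (10, 20, 50, 80)), ("person", 0.84, (100, 40, 30, 60))]
print(landing_zone_is_safe(detections))  # False -> abort and look for another spot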

The neural network navigation system was created on commission from the university's industrial partner, Transport of the Future, a company that develops and produces unmanned aircraft systems. A certificate of registration of a computer program has been obtained from Rospatent for the development.

“The Navigator system is a software platform for ensuring the navigation safety of unmanned aircraft systems using artificial intelligence and machine vision. It solves several important tasks at once. First, there is the segmentation of flight zones: the division of space into transport corridors and the allocation of no-fly zones. For example, agricultural drones with Navigator on board “understand” that these areas of the field need to be watered and treated, while those do not. In addition, the program does not allow drones to accidentally fly out of the cultivated field, which sometimes happens with agricultural drones. The system also detects moving and stationary obstacles and avoids collisions with them. Navigator can plan the flights of several drones in a single flight zone at once so that they do not interfere with each other, and it ensures safety during an emergency landing in an unplanned place: the system determines from the video camera image that there are no people, animals, cars and so on in the landing zone,” said Professor Artyom Nikonorov, Director of the Institute of Artificial Intelligence and Head of the Center for Intelligent Mobility of Multifunctional Unmanned Aircraft Systems at Samara University.

Before the flight, a pilot or technician enters the necessary data into the system on a portable remote control or a ground control station: the working and forbidden zones and the intensity with which particular areas of the field need to be sprayed. The output is a ready-made route along which the drone flies automatically.
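To illustrate how a planned route can be checked against these zones, the sketch below tests each waypoint against a work-zone polygon and a set of no-fly polygons with a standard ray-casting test. The polygon representation, coordinates, and function names are assumptions made for illustration, not the Navigator's actual interface.

# Minimal sketch of a geofence check for route waypoints, assuming zones are
# given as simple polygons of (x, y) coordinates in a local planar frame.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the point (x, y) inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def waypoint_allowed(x, y, work_zone, no_fly_zones):
    """A waypoint is valid if it lies in the work zone and in no forbidden zone."""
    return point_in_polygon(x, y, work_zone) and not any(
        point_in_polygon(x, y, zone) for zone in no_fly_zones
    )

work_zone = [(0, 0), (100, 0), (100, 100), (0, 100)]       # field boundary (assumed)
no_fly_zones = [[(40, 40), (60, 40), (60, 60), (40, 60)]]  # e.g. a power line corridor
print(waypoint_allowed(20, 20, work_zone, no_fly_zones))   # True
print(waypoint_allowed(50, 50, work_zone, no_fly_zones))   # False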

The Navigator runs on a Jetson Nano single-board microcomputer, since a UAV cannot carry a large, powerful computer. According to the developers, they managed to find the best compromise between the quality and speed of the vision algorithms and the system's power consumption.

One of the interesting features of the Navigator is its “stereo vision”: the software package works with video streams received simultaneously from two video cameras spaced some distance apart, like the eyes of living creatures. Drones usually make do with monovision, even if there are several cameras on board: for example, one looks forward, another looks down, and a third looks back. Monovision is like looking with only one eye.

“We tried using a stereo camera so that the Navigator could not only see objects but also determine the distance to them accurately enough. Stereo vision makes it possible to determine the distance to an object several tens of meters away with centimeter accuracy, which is necessary to understand how dangerously close the drone is to that object. Stereo vision also helps to better assess the relief in the emergency landing zone when the flattest surface needs to be chosen; the stereo camera is very well suited for this,” emphasized Artyom Nikonorov.
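The sketch below shows the standard way a rectified stereo pair is converted into metric distances (depth = focal length × baseline / disparity), here using OpenCV's block-matching algorithm as a stand-in for the Navigator's own vision code. The focal length, baseline, and file names are placeholder assumptions, not the parameters of the drone's real stereo camera.

import cv2
import numpy as np

# Assumed camera parameters: focal length in pixels and baseline in metres.
FOCAL_LENGTH_PX = 1400.0
BASELINE_M = 0.12

def distance_map(left_gray, right_gray):
    """Per-pixel distance estimate (metres) from a rectified grayscale stereo pair."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan               # mask invalid matches
    return FOCAL_LENGTH_PX * BASELINE_M / disparity  # depth = f * B / d

# Placeholder file names for two rectified frames of the stereo camera.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
depth = distance_map(left, right)
print(np.nanmin(depth), "m to the nearest matched surface")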

Another benefit of the Navigator is that it helps the drone navigate if the GPS signal is suddenly lost. In advance, the Navigator segments the image of the processed field, splitting it into many fragments, and the neural network can then compare the current camera image with these fragments to determine the drone's location.

“It is quite difficult to solve this problem solely with a camera that looks down, because agricultural fields can look exactly the same over a large area. However, almost all fields have some external landmarks around them: a forest belt, a road, a power line. This makes it possible to detect and prevent the drone from straying outside the field being processed. Based on a piece of the camera image, the system searches for this place in the overall image of the field, which was previously segmented and uploaded to the system, and as a result, the drone can “understand” where it is located. The positioning accuracy of this method can reach about 1 m,” said Artyom Nikonorov.
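A minimal sketch of this camera-only localization idea: the current downward view is matched against a pre-stored orthophoto of the field by normalized cross-correlation, and the best-match position is converted into metric coordinates. The file names, ground resolution, and matching method are assumptions for illustration; the actual Navigator matches the camera image against pre-segmented map fragments with a neural network.

import cv2

# Assumed ground resolution of the pre-stored field map, metres per pixel.
METRES_PER_PIXEL = 0.05

# Placeholder file names: an orthophoto of the field captured before the flight
# and the current downward camera frame, already scaled to the map resolution.
field_map = cv2.imread("field_map.png", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

# Slide the current view over the map and take the best normalized correlation.
result = cv2.matchTemplate(field_map, frame, cv2.TM_CCOEFF_NORMED)
_, score, _, top_left = cv2.minMaxLoc(result)

# Convert the matched pixel position to metric coordinates in the map frame.
x_m = (top_left[0] + frame.shape[1] / 2) * METRES_PER_PIXEL
y_m = (top_left[1] + frame.shape[0] / 2) * METRES_PER_PIXEL
print(f"estimated position: ({x_m:.1f} m, {y_m:.1f} m), match score {score:.2f}")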

The Navigator's vision module is currently being tested. According to the developers, the system will later be supplemented with other useful software modules covering various areas of the future domestic safety ecosystem for unmanned aircraft systems.

For reference:

In 2023, Samara University became one of the winners of the federal competition for the support of Russian research centers in the field of artificial intelligence (AI). The competitive selection was carried out by the Analytical Center under the Government of Russia. Based on the contest's results, more than 850 million rubles will be allocated to the University over three years on a co-financing basis, including 600 million rubles in budgetary subsidies and over 200 million rubles from industrial partners. The money will be spent on designing applied AI solutions to develop the national unmanned aviation industry.

One of the priority projects implemented within the framework of this government support is the development of Russia's first safety ecosystem for domestically produced unmanned aerial vehicles at Samara University. The work on this project is carried out by scientists at the University's Center for Intelligent Mobility of Multifunctional Unmanned Aircraft Systems. The neural-network-based development makes it possible to adjust and automate drone flights and minimize the number of potential emergencies on the ground and in the air. In the long run, such a unified ecosystem will help set high safety standards for the entire unmanned aviation industry in the country. The scientists are implementing this project together with Transport of the Future, the University's industrial partner.

Photo: Olesya Orina